4 research outputs found

    Optimization and Abstraction: A Synergistic Approach for Analyzing Neural Network Robustness

    In recent years, the notion of local robustness (or robustness for short) has emerged as a desirable property of deep neural networks. Intuitively, robustness means that small perturbations to an input do not cause the network to misclassify it. In this paper, we present a novel algorithm for verifying robustness properties of neural networks. Our method synergistically combines gradient-based optimization methods for counterexample search with abstraction-based proof search to obtain a sound and (δ-)complete decision procedure. Our method also employs a data-driven approach to learn a verification policy that guides abstract interpretation during proof search. We have implemented the proposed approach in a tool called Charon and experimentally evaluated it on hundreds of benchmarks. Our experiments show that the proposed approach significantly outperforms three state-of-the-art tools, namely AI², Reluplex, and ReluVal.
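    The combination the abstract describes can be illustrated with a small sketch. The Python snippet below is not the Charon implementation; the toy network, its weights, and all parameter names are assumptions made for illustration. It first runs a projected gradient search for a concrete counterexample inside an epsilon-box and, if none is found, attempts an interval-abstraction proof of robustness, returning "unknown" where a real tool would refine the abstraction.

    # Minimal sketch (not Charon): gradient-based counterexample search
    # combined with an interval-abstraction proof for a tiny ReLU network.
    # Weights, epsilon, and step sizes below are illustrative assumptions.
    import numpy as np

    W1 = np.array([[1.0, -1.0], [0.5, 2.0]])   # hypothetical layer 1 weights
    b1 = np.array([0.1, -0.2])
    W2 = np.array([[1.0, 0.3], [-0.7, 1.2]])   # hypothetical layer 2 weights
    b2 = np.array([0.0, 0.1])

    def forward(x):
        h = np.maximum(W1 @ x + b1, 0.0)       # ReLU hidden layer
        return W2 @ h + b2                     # output logits

    def interval_bounds(lo, hi):
        # Interval bound propagation: a sound over-approximation of the
        # logits over the whole input box [lo, hi].
        def affine(W, b, lo, hi):
            c, r = (lo + hi) / 2.0, (hi - lo) / 2.0
            return W @ c + b - np.abs(W) @ r, W @ c + b + np.abs(W) @ r
        lo, hi = affine(W1, b1, lo, hi)
        lo, hi = np.maximum(lo, 0.0), np.maximum(hi, 0.0)  # ReLU is monotone
        return affine(W2, b2, lo, hi)

    def gradient_attack(x0, eps, label, steps=50, lr=0.05):
        # Projected gradient ascent on a wrong-class margin: a cheap search
        # for a concrete misclassified point inside the eps-box.
        x = x0.copy()
        for _ in range(steps):
            g = np.zeros_like(x)
            for i in range(x.size):            # numeric central-difference gradient
                d = np.zeros_like(x); d[i] = 1e-4
                f = lambda z: (np.max(np.delete(forward(z), label))
                               - forward(z)[label])
                g[i] = (f(x + d) - f(x - d)) / 2e-4
            x = np.clip(x + lr * np.sign(g), x0 - eps, x0 + eps)
            if np.argmax(forward(x)) != label:
                return x                       # counterexample found
        return None

    def verify_robust(x0, eps, label):
        cex = gradient_attack(x0, eps, label)
        if cex is not None:
            return "falsified", cex
        lo, hi = interval_bounds(x0 - eps, x0 + eps)
        if lo[label] > np.max(np.delete(hi, label)):
            return "verified", None            # abstract proof succeeds
        return "unknown", None                 # a real tool would refine here

    x0 = np.array([0.5, -0.3])
    print(verify_robust(x0, eps=0.05, label=np.argmax(forward(x0))))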

    Using Interval Constraint Propagation for Pseudo-Boolean Constraint Solving

    This work is motivated by (1) a practical application that automatically generates test patterns for integrated circuits and (2) the observation that off-the-shelf state-of-the-art pseudo-Boolean solvers have difficulty solving instances with huge pseudo-Boolean constraints as created by our application. Derived from the SMT solver iSAT3, we present the solver iSAT3p, which on the one hand allows the efficient handling of huge pseudo-Boolean constraints with several thousand summands and large integer coefficients. On the other hand, experimental results demonstrate that at the same time iSAT3p is competitive with or even superior to other solvers on standard pseudo-Boolean benchmark families.
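    The core contraction step of interval constraint propagation can be sketched on a single pseudo-Boolean constraint. The Python snippet below is a minimal illustration, not iSAT3p; the coefficients and bound are assumptions. It prunes a value from a variable's {0,1} domain whenever that value cannot be extended to satisfy sum_i a_i*x_i >= k even under the best choice for all other variables, iterating until a fixpoint or a conflict.

    # Minimal sketch (not iSAT3p): interval constraint propagation on one
    # pseudo-Boolean constraint  sum_i a_i * x_i >= k  with x_i in {0, 1}.
    def propagate(coeffs, k, domains):
        # Prune x_i = v if the constraint is unsatisfiable under v even when
        # every other variable takes its most helpful value. Returns the
        # contracted domains, or None on conflict.
        changed = True
        while changed:
            changed = False
            for i, a in enumerate(coeffs):
                rest_max = sum(a_j * max(domains[j]) if a_j > 0
                               else a_j * min(domains[j])
                               for j, a_j in enumerate(coeffs) if j != i)
                new_dom = {v for v in domains[i] if a * v + rest_max >= k}
                if not new_dom:
                    return None               # conflict: constraint unsat
                if new_dom != domains[i]:
                    domains[i] = new_dom
                    changed = True
        return domains

    # Example (assumed): 5*x0 + 3*x1 + 2*x2 >= 8 forces x0 = 1 and x1 = 1
    # by propagation alone, leaving x2 free.
    print(propagate([5, 3, 2], 8, [{0, 1}, {0, 1}, {0, 1}]))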